
    An exact Jacobi map in the geodesic light-cone gauge

    The remarkable properties of the recently proposed geodesic light-cone (GLC) gauge allow one to solve the geodetic-deviation equation explicitly, and thus to derive an exact expression for the Jacobi map J^A_B(s,o) connecting a generic source s to a geodesic observer o in a generic spacetime. In this gauge J^A_B factorizes into the product of a local quantity at s and one at o, implying similarly factorized expressions for the area and luminosity distances. In any other coordinate system J^A_B is simply obtained by expressing the GLC quantities in terms of the corresponding ones in the new coordinates. This is explicitly done, at first and second order respectively, for the synchronous and Poisson gauge-fixings of a perturbed, spatially flat cosmological background, and the consistency of the two outcomes is checked. Our results slightly amend previous calculations of the luminosity-redshift relation and suggest a possible non-perturbative way of computing the effects of inhomogeneities on observations based on light-like signals.
    Comment: 26 pages, no figures. Inconsequential modification of an equation, comments and references added. Version accepted for publication in JCAP.
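As background for the factorized distance expressions, the standard definitions assumed here (a sketch; the paper's exact conventions may differ) relate the Jacobi map to the area distance and, via Etherington reciprocity, to the luminosity distance:

```latex
% Jacobi map: angular displacement at the observer o
% -> transverse proper displacement at the source s
\xi^A(s) \;=\; J^A_{\ B}(s,o)\,\delta\theta^B_o ,
% area distance from its determinant; luminosity distance via reciprocity
d_A^2 \;=\; \det J^A_{\ B} , \qquad d_L \;=\; (1+z)^2\, d_A .
```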

    A new approach to the propagation of light-like signals in perturbed cosmological backgrounds

    We present a new method to compute the deflection of light rays in a perturbed FLRW geometry. We exploit the properties of the Geodesic Light Cone (GLC) gauge, where null rays propagate at constant angular coordinates irrespective of the given (inhomogeneous and/or anisotropic) geometry. The gravitational deflection of null geodesics can then be obtained, in any other gauge, simply by expressing the angular coordinates of the given gauge in terms of the GLC angular coordinates. We apply this method to the standard Poisson gauge, including scalar perturbations, and give the full result for the deflection effect in terms of the direction of observation and the observed redshift up to second order, and up to third order for the leading lensing terms. We also compare our results with those presently available in the literature and, in particular, provide a new non-trivial check of a previous result on the luminosity-redshift relation up to second order in cosmological perturbation theory.
    Comment: 37 pages, no figures. Typos corrected, comments and references added. Version accepted for publication in JCAP.
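The core of the method can be summarized schematically (notation chosen here for illustration): in GLC coordinates $(\tau, w, \tilde\theta^a)$ the angular coordinates are constant along each null ray of the past light-cone, so the deflection seen in any other gauge is read off from the coordinate transformation alone:

```latex
\tilde\theta^a(\lambda) \;=\; \tilde\theta^a_o \;=\; \mathrm{const}
\quad\Longrightarrow\quad
\delta\theta^a(\lambda) \;=\; \theta^a\!\big(\tilde\theta^b_o,\lambda\big) \;-\; \tilde\theta^a_o ,
```

where $\theta^a$ are the angular coordinates of the chosen gauge (e.g. Poisson), expanded perturbatively in terms of the GLC ones.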

    Do stochastic inhomogeneities affect dark-energy precision measurements?

    The effect of a stochastic background of cosmological perturbations on the luminosity-redshift relation is computed to second order through a recently proposed covariant and gauge-invariant light-cone averaging procedure. The resulting expressions are free from both ultraviolet and infrared divergences, implying that such perturbations cannot mimic a sizable fraction of dark energy. Different averages are estimated and depend on the particular function of the luminosity distance being averaged. The energy flux, being minimally affected by perturbations at large z, is proposed as the best choice for precision estimates of dark-energy parameters. Nonetheless, its irreducible (stochastic) variance induces statistical errors on \Omega_{\Lambda}(z) typically lying in the few-percent range.
    Comment: 5 pages, 3 figures. Comments and references added. Typos corrected. Version accepted for publication in Phys. Rev. Lett.
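Why the averaged quantity matters can be seen from a simple second-order expansion (a sketch, not the paper's full light-cone computation): writing $d_L = \bar d_L(1+\delta)$ for a fluctuation $\delta$, different functions of $d_L$ pick up different combinations of $\langle\delta\rangle$ and $\langle\delta^2\rangle$:

```latex
\langle d_L \rangle \;=\; \bar d_L\,\big(1+\langle\delta\rangle\big), \qquad
\Phi \;=\; \frac{L}{4\pi d_L^2}
\;\Rightarrow\;
\langle \Phi \rangle \;=\; \bar\Phi\,\big\langle (1+\delta)^{-2} \big\rangle
\;\simeq\; \bar\Phi\,\big(1 - 2\langle\delta\rangle + 3\langle\delta^2\rangle\big),
```

so the second-order corrections to the averaged flux differ from those of the averaged distance itself.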

    Robust weighted aggregation of expert opinions in futures studies

    Expert judgments are widespread in many fields, and the way in which they are collected and the procedure by which they are aggregated are considered crucial steps. From a statistical perspective, expert judgments are subjective data and must be gathered and treated as carefully and scientifically as possible. In the elicitation phase, a multitude of experts is preferable to a single expert, and techniques based on anonymity and iterations, such as Delphi, offer many advantages in terms of reducing distortions, which are mainly related to cognitive biases. There are two approaches to the aggregation of the judgments given by a panel of experts, referred to as behavioural (implying an interaction between the experts) and mathematical (involving non-interacting participants and the aggregation of the judgments using a mathematical formula). Both have advantages and disadvantages, and with the mathematical approach, the main problem concerns the subjective choice of an appropriate formula for both normalization and aggregation. We propose a new method for aggregating and processing subjective data collected using the Delphi method, with the aim of obtaining robust rankings of the outputs. This method makes it possible to normalize and aggregate the opinions of a panel of experts, while modelling different sources of uncertainty. We use an uncertainty analysis approach that allows the contemporaneous use of different aggregation and normalization functions, so that the result does not depend on the choice of a specific mathematical formula, thereby solving the problem of choice. Furthermore, we can also model the uncertainty related to the weighting system, which reflects the different expertise of the participants as well as expert opinion accuracy. 
By combining the Delphi method with the robust ranking procedure, we offer a new protocol covering the elicitation, aggregation and processing of subjective data used in the construction of Delphi-based future scenarios. The method is very flexible and can be applied to the aggregation and processing of any subjective judgments, including those outside the context of futures studies. Finally, we show the validity, reproducibility and potential of the method through its application to the future of Italian families.
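The idea of making the ranking independent of any single formula can be sketched in a few lines of Python. This is a hypothetical illustration, not the authors' protocol: the normalization and aggregation functions, the invented expert scores, and the use of a mean rank across all combinations are all assumptions made here for concreteness.

```python
# Robust ranking sketch: aggregate expert scores under several
# normalization x aggregation choices, then keep the ordering that
# is stable across them (here: the mean rank over all combinations).
from statistics import mean, median

# judgments[item] = one score per expert (invented data)
judgments = {
    "scenario_A": [7, 8, 6, 9],
    "scenario_B": [5, 6, 7, 5],
    "scenario_C": [9, 7, 8, 8],
}

def minmax(xs):
    """Rescale one expert's scores to [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) if hi > lo else 0.5 for x in xs]

def ranknorm(xs):
    """Replace one expert's scores by their normalized ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    out = [0.0] * len(xs)
    for pos, i in enumerate(order):
        out[i] = pos / (len(xs) - 1) if len(xs) > 1 else 0.5
    return out

def robust_ranking(judgments, normalizers=(minmax, ranknorm),
                   aggregators=(mean, median)):
    """Ranking (best first) averaged over all normalization/aggregation pairs."""
    items = list(judgments)
    n_experts = len(next(iter(judgments.values())))
    total_rank = {it: 0 for it in items}
    n_combos = 0
    for norm in normalizers:
        # normalize each expert's column of scores across items
        cols = [norm([judgments[it][j] for it in items])
                for j in range(n_experts)]
        normed = {it: [cols[j][i] for j in range(n_experts)]
                  for i, it in enumerate(items)}
        for agg in aggregators:
            scores = {it: agg(normed[it]) for it in items}
            for r, it in enumerate(sorted(items, key=lambda it: -scores[it])):
                total_rank[it] += r
            n_combos += 1
    return sorted(items, key=lambda it: total_rank[it] / n_combos)

print(robust_ranking(judgments))
```

Uncertainty in the expert weighting system could be modelled the same way, by adding weighted aggregators to the pool of combinations.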

    Primordial Black Holes from Pre-Big Bang inflation

    We discuss the possibility of producing a significant fraction of dark matter in the form of primordial black holes in the context of the pre-big bang inflationary scenario. To this purpose, we take into account the enhancement of curvature perturbations possibly induced by a variation of the sound-speed parameter c_s during the string phase of high-curvature inflation. After imposing all relevant observational constraints, we find that the considered class of models is compatible with the production of a large amount of primordial black holes in the mass range relevant to dark matter, provided the sound-speed parameter is confined to a rather narrow range of values, 0.003 < c_s < 0.01.
    Comment: 26 pages, two figures. Many new references and a few comments added. Version accepted for publication in JCAP.

    A covariant and gauge invariant formulation of the cosmological "backreaction"

    Using our recent proposal for defining gauge-invariant averages, we give a general-covariant formulation of the so-called cosmological "backreaction". Our effective covariant equations allow us to describe, in an explicitly gauge-invariant form, the way classical or quantum inhomogeneities affect the average evolution of our Universe.
    Comment: 12 pages, no figures. Typos corrected, matches version to appear in JCAP.

    A Method to Address the Effectiveness of the SIC Code for Selecting Comparable Firms

    Finding peer firms is important in several situations, for example in equity valuation for both publicly traded and non-publicly traded firms. Very often the pay of a chief executive officer (CEO) is set on the basis of a peer compensation group, and financial policies are often driven by a response to peers. A very common approach is to form peer groups using industry membership as given by the SEC (United States Securities and Exchange Commission) SIC (Standard Industrial Classification) code. In this paper, the effectiveness of the SIC code for selecting comparable firms is evaluated through nonparametric tests for differences in firm financial ratios.
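One common nonparametric test for such differences is the Kruskal-Wallis H-test; the sketch below is an assumption for illustration (the paper's exact test statistic and data are not specified here), comparing an invented financial ratio, return on assets, across two hypothetical SIC groups.

```python
# Hypothetical sketch: Kruskal-Wallis H statistic comparing a financial
# ratio across firms grouped by SIC code. All firm data are invented.

def kruskal_wallis(groups):
    """H statistic for k groups; ties receive average ranks."""
    pooled = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    n = len(pooled)
    ranks = [0.0] * n
    i = 0
    while i < n:
        # find the run of tied values starting at i
        j = i
        while j < n and pooled[j][0] == pooled[i][0]:
            j += 1
        avg = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg
        i = j
    rank_sums = [0.0] * len(groups)
    for (_, gi), r in zip(pooled, ranks):
        rank_sums[gi] += r
    return 12.0 / (n * (n + 1)) * sum(
        rs * rs / len(g) for rs, g in zip(rank_sums, groups)
    ) - 3 * (n + 1)

# return on assets (%) for firms in two hypothetical SIC groups
sic_2834 = [5.1, 6.3, 4.8, 7.0, 5.9]
sic_3571 = [2.0, 3.1, 2.7, 1.8, 3.5]
print(kruskal_wallis([sic_2834, sic_3571]))
```

A large H (compared to the chi-squared distribution with k-1 degrees of freedom) would indicate that the ratio distributions differ across the groups, i.e. that the SIC partition does capture a real difference for that ratio.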

    Light-cone averaging in cosmology: formalism and applications

    We present a general gauge-invariant formalism for defining cosmological averages that are relevant for observations based on light-like signals. Such averages involve either null hypersurfaces corresponding to a family of past light-cones or compact surfaces given by their intersection with timelike hypersurfaces. Generalized Buchert-Ehlers commutation rules for derivatives of these light-cone averages are given. After introducing some adapted "geodesic light-cone" coordinates, we give explicit expressions for averaging the redshift-to-luminosity-distance relation and the so-called "redshift drift" in a generic inhomogeneous Universe.
    Comment: 20 pages, 2 figures. Comments and references added, typos corrected. Version accepted for publication in JCAP.
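For orientation, the ordinary (spatial, domain-averaged) Buchert-Ehlers commutation rule that the light-cone rules generalize reads, with $\theta$ the expansion scalar and $\mathcal D$ the averaging domain:

```latex
\partial_t \langle S \rangle_{\mathcal D}
\;-\; \langle \partial_t S \rangle_{\mathcal D}
\;=\; \langle S\,\theta \rangle_{\mathcal D}
\;-\; \langle S \rangle_{\mathcal D}\,\langle \theta \rangle_{\mathcal D} :
```

time differentiation and averaging fail to commute by exactly the covariance of $S$ with the local expansion rate.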